Sparsifying the Fisher Linear Discriminant by Rotation.
Authors
Abstract
Many high-dimensional classification techniques based on sparse linear discriminant analysis (LDA) have been proposed in the literature. To use them efficiently, sparsity of the linear classifier is a prerequisite. However, such sparsity may not be readily available in many applications, and rotations of the data are required to create it. In this paper, we propose a family of rotations to create the required sparsity. The basic idea is to rotate the data first, using the principal components of the sample covariance matrix of the pooled samples or its variants, and then to apply an existing high-dimensional classifier. This rotate-and-solve procedure can be combined with any existing classifier and is robust against the sparsity level of the true model. We show that these rotations do create the sparsity needed for high-dimensional classification and provide a theoretical understanding of why such a rotation works empirically. The effectiveness of the proposed method is demonstrated on a number of simulated and real data examples, and the improvements over some popular high-dimensional classification rules are clearly shown.
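As a concrete illustration of the rotate-and-solve idea, here is a minimal sketch in Python. It assumes NumPy and scikit-learn are available and that the inputs are NumPy arrays; the rotation uses the eigenvectors of the pooled within-class sample covariance, and ordinary LDA stands in for the sparse high-dimensional classifier that the paper would actually pair with the rotation.

import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def rotate_and_solve(X_train, y_train, X_test, base_classifier=None):
    # Sketch of the rotate-and-solve idea: rotate the data with the principal
    # components of the pooled (within-class) sample covariance, then fit an
    # existing classifier in the rotated coordinates.
    if base_classifier is None:
        base_classifier = LinearDiscriminantAnalysis()

    classes = np.unique(y_train)

    # Pooled within-class covariance: center each class at its own mean.
    centered = np.vstack([X_train[y_train == c] - X_train[y_train == c].mean(axis=0)
                          for c in classes])
    pooled_cov = centered.T @ centered / (X_train.shape[0] - classes.size)

    # Rotation matrix: orthonormal eigenvectors (principal components).
    _, rotation = np.linalg.eigh(pooled_cov)

    # Solve the classification problem in the rotated coordinates.
    base_classifier.fit(X_train @ rotation, y_train)
    return base_classifier.predict(X_test @ rotation)

A call such as y_hat = rotate_and_solve(X_train, y_train, X_test) then returns predictions for the test points; because the rotation is orthogonal, only the coordinate system in which the classifier looks for sparsity changes.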
Similar sources
Robust Fisher Discriminant Analysis
Fisher linear discriminant analysis (LDA) can be sensitive to the problem data. Robust Fisher LDA can systematically alleviate this sensitivity by explicitly incorporating a model of data uncertainty in the classification problem and optimizing for the worst-case scenario under this model. The main contribution of this paper is to show that, with general convex uncertainty models on the problem ...
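For orientation, the worst-case formulation sketched in that abstract can be written, in generic notation not taken from the cited paper (with an uncertainty set \mathcal{U} for the two class means and covariances), as

\[ \max_{w \neq 0} \; \min_{(\mu_1,\mu_2,\Sigma_1,\Sigma_2) \in \mathcal{U}} \; \frac{\bigl(w^\top (\mu_1 - \mu_2)\bigr)^2}{w^\top (\Sigma_1 + \Sigma_2)\, w}, \]

i.e. the discriminant direction w is chosen to maximize the Fisher discriminant ratio attained under the least favourable parameters allowed by the uncertainty model.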
Rapid and Brief Communication: Alternative linear discriminant classifier
Fisher linear discriminant analysis (FLDA) finds a set of optimal discriminating vectors by maximizing the Fisher criterion, i.e., the ratio of the between-class scatter to the within-class scatter. One of its major disadvantages is that the number of discriminating vectors it can find is bounded above by C-1 for a C-class problem. In this paper, for the binary-class problem, we propose an alternative FL...
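In standard notation (with S_B and S_W the between-class and within-class scatter matrices), the Fisher criterion referred to above is

\[ J(w) = \frac{w^\top S_B\, w}{w^\top S_W\, w}, \]

and since S_B is built from the C class means it has rank at most C-1, which is why no more than C-1 useful discriminating vectors can be extracted for a C-class problem.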
A Model Classification Technique for Linear Discriminant Analysis for Two Groups
Linear discriminant analysis, introduced by Fisher, is a well-known dimension-reduction and classification approach that has received much attention in the statistical literature. Most researchers have focused attention on its deficiencies. As such, different versions of the classification procedure have been introduced for various applications. In this paper, we attempt not to robustify the Fisher linear...
Error bounds for Kernel Fisher Linear Discriminant in Gaussian Hilbert space
We give a non-trivial, non-asymptotic upper bound on the classification error of the popular Kernel Fisher Linear Discriminant classifier under the assumption that the kernel-induced space is a Gaussian Hilbert space.
Multiclass Linear Dimension Reduction by Weighted Pairwise Fisher Criteria
We derive a class of computationally inexpensive linear dimension-reduction criteria by introducing a weighted variant of the well-known K-class Fisher criterion associated with linear discriminant analysis (LDA). It can be seen that LDA weights the contributions of individual class pairs according to the Euclidean distance between the respective class means. We generalize upon LDA by introducing a dif...
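A generic form of such a weighted pairwise criterion, written with assumed notation rather than the exact symbols of the cited paper (class priors p_i, class means m_i, pooled within-class scatter S_W, linear map A, and a weight function \omega of the distance d_{ij} = \lVert m_i - m_j \rVert between class means), is

\[ J(A) = \sum_{i<j} p_i\, p_j\, \omega(d_{ij})\, \operatorname{tr}\!\bigl[(A S_W A^\top)^{-1} A (m_i - m_j)(m_i - m_j)^\top A^\top\bigr], \]

so the choice of \omega controls how strongly each pair of classes influences the dimension-reduction map.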
Journal title: Journal of the Royal Statistical Society: Series B (Statistical Methodology)
Volume 77, Issue 4
Pages: -
Publication date: 2015